@make(PublicationOrderForm,Number 13,AnnouncementDate "August 1981")

@begin[center]
List Number 13
@end[center]

@Entry(Code "STAN-CS-81-855",
	Title "Deductive Synthesis of the Unification Algorithm",
	Author "Zohar Manna and Richard Waldinger",
	Price "$3.55", FREE,
	Note "51 pages",
	Date "June 1981")

The @i[deductive approach] is a formal program-construction method in which the
derivation of a program from a given specification is regarded as a theorem-proving
task.  To construct a program whose output satisfies the conditions of the 
specification, we prove a theorem stating the existence of such an output.  The
proof is restricted to be sufficiently constructive so that a program computing
the desired output can be extracted directly from the proof.  The program we obtain
is applicative and may consist of several mutually recursive procedures.  The proof
constitutes a demonstration of the correctness of this program.

To exhibit the full power of the deductive approach, we apply it to a nontrivial
example--the synthesis of a @i[unification algorithm].  Unification is the process 
of finding a common instance of two expressions.  Algorithms to perform unification
have been central to many theorem-proving systems and some programming-language
processors.

The task of deriving a unification algorithm automatically is beyond the power of
existing program-synthesis systems.  In this paper, we use the deductive approach to
derive an algorithm from a simple, high-level specification of the unification task.
We will identify some of the capabilities required of a theorem-proving system to
perform this derivation automatically.
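
As an informal illustration of the unification task itself (not of the report's
deductive derivation), the following sketch implements a conventional unification
algorithm in Python; the term representation -- strings beginning with `?' for
variables, tuples for compound terms -- is our own.

@begin(verbatim)
# A minimal conventional unification sketch (illustration only; this is not
# the algorithm deductively derived in the report).  Variables are strings
# beginning with '?'; compound terms are tuples (symbol, arg1, ..., argn).

def substitute(term, subst):
    """Apply a substitution (dict: variable -> term) to a term."""
    if isinstance(term, str) and term.startswith('?'):
        return substitute(subst[term], subst) if term in subst else term
    if isinstance(term, tuple):
        return (term[0],) + tuple(substitute(a, subst) for a in term[1:])
    return term

def occurs(var, term, subst):
    """Occur check: does var appear in term under subst?"""
    term = substitute(term, subst)
    if term == var:
        return True
    if isinstance(term, tuple):
        return any(occurs(var, a, subst) for a in term[1:])
    return False

def unify(s, t, subst=None):
    """Return a most general unifier of s and t, or None if none exists."""
    subst = {} if subst is None else subst
    s, t = substitute(s, subst), substitute(t, subst)
    if s == t:
        return subst
    if isinstance(s, str) and s.startswith('?'):
        if occurs(s, t, subst):
            return None                      # occur-check failure
        return {**subst, s: t}
    if isinstance(t, str) and t.startswith('?'):
        return unify(t, s, subst)
    if (isinstance(s, tuple) and isinstance(t, tuple)
            and s[0] == t[0] and len(s) == len(t)):
        for a, b in zip(s[1:], t[1:]):
            subst = unify(a, b, subst)
            if subst is None:
                return None
        return subst
    return None                              # clash of distinct symbols

# Unifying f(?x, g(a)) with f(g(?y), ?x) yields ?x = g(a), ?y = a:
print(unify(('f', '?x', ('g', 'a')), ('f', ('g', '?y'), '?x')))
@end(verbatim)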

@Entry(Code "STAN-CS-81-856",
	Title "An Organization for Programs in Fluid Domains",
	Author "Richard Paul Gabriel",
	Price "", FREE,
	Note "190 pages",
	Date "May 1981")

An organization for very large artificial intelligence programs is
presented with a detailed example in the area of natural language
generation. The concept of a @i[fluid domain] is defined and shown to
introduce constraints on the type of program organization required in such
domains.

The organization is a community of @i[stratifications] which
communicate with each other in a fairly loose manner.  Each stratification
is a hierarchy of individuals, called @i[units], organized in
meta-object/object relationships; these relationships allow observation
and description of some units by others.  It is shown that a great deal of
power derives from organizations of this type and from the interactions
between the objects that exist in such systems.  Further, this organization
is demonstrated to be a valuable method of viewing both intelligence and
the programming structures needed to produce intelligence.
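
As a loose illustration of the meta-object/object relationship among units (the
class and field names below are invented for this note, not taken from the
report), a unit that is meta to others can observe and describe them:

@begin(verbatim)
# Hypothetical sketch of units linked by meta-object/object relationships.
# All names and structure here are illustrative, not the report's.

class Unit:
    """An individual in a stratification."""
    def __init__(self, name, properties=None):
        self.name = name
        self.properties = properties or {}
        self.objects = []                 # units this unit is meta to

    def observe(self, unit):
        """Place a unit under this unit's observation."""
        self.objects.append(unit)

    def describe(self):
        """Describe the observed units from the meta level."""
        return {u.name: dict(u.properties) for u in self.objects}

# A two-level fragment: one meta-unit observing two object-level units.
noun = Unit('noun-phrase', {'role': 'subject'})
verb = Unit('verb-phrase', {'tense': 'past'})
meta = Unit('clause-describer')
meta.observe(noun)
meta.observe(verb)
print(meta.describe())
@end(verbatim)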


@Entry(Code "STAN-CS-81-857",
	Title "Query Optimization by Semantic Reasoning",
	Author "Jonathan Jay King",
	Price "$5.85", FREE,
	Note "128 pages",
	Date "May 1981")

This thesis presents a new approach to the problem of database query
optimization, called @i<semantic query optimization>.  The goal of
semantic query optimization is to produce a semantically equivalent query
that is less expensive to process than the original query.  The knowledge
used to transform a query is the same knowledge used to ensure the
@i<semantic integrity> of the data stored in the database.  A system
called @b<QUIST> (@b<QUery Improvement through Semantic Transformation>)
is described that carries out semantic query optimization for an important
class of relational database queries.  For many queries, QUIST can achieve
substantial reductions in the cost of retrieval.  It does this at a
negligible cost for its analysis by relying on heuristics based on
knowledge of database access and structure.

The thesis also addresses an important issue in current automatic planning
research:  production not just of a correct solution but of a ``good''
one, by means of an efficient problem solver.  Semantic query optimization
advances the notion of a @i<problem reformulation> step for
problem-solving programs.  In this step, equivalent statements of the
original problem are sought, one of which may have a better solution than
the original problem.  This method avoids explicit and possibly costly
analysis of efficiency factors during planning itself.
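
The flavor of such a transformation can be suggested by a small invented example:
an integrity constraint known to hold for all stored data is used to add a
restriction on an indexed attribute, yielding a semantically equivalent query
with a cheaper access path.  The schema, constraint, and cost assumptions below
are illustrative only; they are not QUIST's.

@begin(verbatim)
# Toy sketch of semantic query optimization (invented schema and constraint).
# A query is a list of conjunctive conditions (attribute, operator, value).

query = [('ship_type', '=', 'tanker'), ('owner', '=', 'Acme')]

# Assumed integrity constraint: every tanker has capacity > 400.  Since the
# constraint holds for all stored data, adding the implied condition cannot
# change the answer -- but 'capacity' is indexed, so the transformed query
# is cheaper to process.
constraints = [
    {'if': ('ship_type', '=', 'tanker'), 'then': ('capacity', '>', 400)},
]
indexed = {'capacity'}

def optimize(query):
    out = list(query)
    for c in constraints:
        implied = c['then']
        if c['if'] in query and implied[0] in indexed and implied not in out:
            out.append(implied)          # semantically redundant, but cheaper
    return out

print(optimize(query))
# [('ship_type', '=', 'tanker'), ('owner', '=', 'Acme'), ('capacity', '>', 400)]
@end(verbatim)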

@Entry(Code "STAN-CS-81-858",
	Title "The Bit Operation Complexity of Approximate Evaluation of Matrix
and Polynomial Products Using Modular Arithmetic",
	Author "V. Ya. Pan",
	Price "$2.20", FREE,
	Note "6 pages",
	Date "June 1981")

Matrix and polynomial products are evaluated approximately, to a given precision,
using modular arithmetic.  The resulting algorithms are numerically stable.  At the
same time, they are as fast as or faster than algorithms that perform their
arithmetic over the real or complex numbers.
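
The underlying idea -- reduce approximate real arithmetic to exact integer
arithmetic modulo a large prime, then map the result back -- can be conveyed by a
toy sketch.  The sketch is ours and omits the report's algorithms and their
bit-complexity analysis entirely.

@begin(verbatim)
# Toy sketch: approximate matrix multiplication via exact modular arithmetic.
# Entries are scaled to integers, multiplied exactly mod a large prime, and
# mapped back.  P must exceed twice the largest scaled result magnitude.

P = 2**61 - 1          # a Mersenne prime
SCALE = 2**20          # fixed-point scale; sets the precision

def to_fixed(x):
    return round(x * SCALE) % P

def from_fixed(v):
    if v > P // 2:                        # recover negative residues
        v -= P
    return v / (SCALE * SCALE)            # two scaled factors were multiplied

def matmul_mod(A, B):
    n, m, k = len(A), len(B), len(B[0])
    Ai = [[to_fixed(x) for x in row] for row in A]
    Bi = [[to_fixed(x) for x in row] for row in B]
    C = [[0] * k for _ in range(n)]
    for i in range(n):
        for j in range(k):
            s = 0
            for t in range(m):
                s = (s + Ai[i][t] * Bi[t][j]) % P
            C[i][j] = from_fixed(s)
    return C

print(matmul_mod([[1.5, -2.0]], [[0.25], [1.0]]))   # [[-1.625]]
@end(verbatim)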


@Entry(Code "STAN-CS-81-859",
	Title "The Additive and Logical Complexities of Linear and Bilinear
Arithmetic Algorithms",
	Author "V. Ya. Pan",
	Price "$2.65", FREE,
	Note "21 pages",
	Date "June 1981")

@blankspace(2.5 inch)

@Entry(Code "STAN-CS-81-860",
	Title "Computing the Controllability/Observability Decomposition of a Linear
Time-Invariant Dynamic System, A Numerical Approach",
	Author "Daniel Boley",
	Price "$4.45", FREE,
	Note "81 pages",
	Date "June 1981")

We examine various numerical properties involved in computing the complete 
Controllability-Observability (Kalman) Decomposition for a linear time-invariant
dynamic system, of the form
@begin(center)
dx/dt = @i[A]x + @i[B]u
y = @i[C]x
@end(center)

where @i[A, B, C] are matrices (@i[A] square), and  u, x, y  are vector functions of
time.  In particular, we discuss the numerical stability, the cost, and the
particular advantages of several algorithms.  We also examine several ways to
measure ill-conditioning in the data.
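
For orientation, the controllable subspace that the decomposition separates out
can be computed naively from the rank of the controllability matrix
[B, AB, ..., A^(n-1)B].  The sketch below is the textbook method, with an
invented example system; the rank tolerance it requires is precisely the kind of
ill-conditioning question the report examines.

@begin(verbatim)
# Naive computation of the controllable subspace (textbook method, not the
# numerically careful algorithms analyzed in the report).
import numpy as np

def controllable_subspace(A, B, tol=1e-10):
    """Orthonormal basis of the controllable subspace, via the rank of the
    controllability matrix [B, AB, ..., A^(n-1) B]."""
    n = A.shape[0]
    blocks = [B]
    for _ in range(n - 1):
        blocks.append(A @ blocks[-1])
    K = np.hstack(blocks)                 # n x (n*m) controllability matrix
    U, s, _ = np.linalg.svd(K)
    r = int((s > tol * s[0]).sum()) if s.size and s[0] > 0 else 0
    return U[:, :r]                       # columns span the subspace

# Invented example: the second state is disconnected from the input.
A = np.array([[1.0, 0.0], [0.0, 2.0]])
B = np.array([[1.0], [0.0]])
print(controllable_subspace(A, B))        # one basis vector along e1 (up to sign)
@end(verbatim)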


@Entry(Code "STAN-CS-81-861",
	Title "Symbolic Reasoning Among 3-D Models and 2-D Images",
	Author "Rodney A. Brooks",
	Price "$7.45", FREE,
	Note "181 pages",
	Date "June 1981")

An implemented and operational model-based vision system is described.
Examples are given of its interpretation of images, including extraction
of three-dimensional parameters from monocular images. Advances are
presented in representation for geometric modeling of objects and object
classes, in techniques for manipulating non-linear symbolic algebraic
constraints, in geometric reasoning in incompletely specified situations,
and in constructing algebraic constraints from image measurements.  Both
generic object classes and specific objects are represented by volume
models which are independent of viewpoint. Complex real-world object
classes are modeled.  Variations in size, structure and spatial relations
within object classes can be modeled. New spatial reasoning techniques are
described which are useful both for prediction within a vision system, and
for planning within a manipulation system. New approaches to prediction
and interpretation are introduced, based on the propagation of symbolic
constraints.  Predictions are two-pronged. First, prediction graphs
provide a coarse filter for hypothesizing matches of objects to image
features.  Second, prediction graphs contain instructions on how to use
measurements of image features to deduce three-dimensional information
about tentative object interpretations.  Interpretation proceeds by
merging local hypothesized matches, subject to consistent derived
implications about the size, structure and spatial configuration of the
hypothesized objects.  Prediction, description and interpretation proceed
concurrently, from coarse object subpart and class interpretations of
images, to fine distinctions among object subclasses, and more precise
three-dimensional quantification of objects.
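
A toy example may suggest how measurement-derived implications are merged, though
it is far simpler than the report's prediction graphs: under an invented
pinhole-camera assumption, bounds on viewing depth yield bounds on an object
dimension, and competing local hypotheses are merged by interval intersection.

@begin(verbatim)
# Toy sketch of merging measurement-derived bounds (invented camera model;
# far simpler than the report's symbolic constraint machinery).
# Pinhole model: true_length = image_length * depth / focal_length, so
# bounds on depth give bounds on the object's true length.

def length_bounds(image_len, depth_lo, depth_hi, focal):
    return (image_len * depth_lo / focal, image_len * depth_hi / focal)

def merge(b1, b2):
    """Intersect two interval hypotheses; None signals inconsistency."""
    lo, hi = max(b1[0], b2[0]), min(b1[1], b2[1])
    return (lo, hi) if lo <= hi else None

# Two image features constraining the same model dimension:
h1 = length_bounds(12.0, 90.0, 110.0, 50.0)     # (21.6, 26.4)
h2 = length_bounds(10.0, 100.0, 140.0, 50.0)    # (20.0, 28.0)
print(merge(h1, h2))    # (21.6, 26.4): the hypotheses are consistent
@end(verbatim)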

@Entry(Code "STAN-CS-81-862",
	Title "The Lower Bounds on the Additive Complexity of Bilinear Problems
in Terms of Some Algebraic Quantities",
	Author "V. Ya. Pan",
	Price "$2.10", FREE,
	Note "4 pages",
	Date "June 1981")

The lower bounds on the additive complexity of a bilinear problem are expressed
in terms of the rank of the problem and also as a minimum number of elementary 
steps for the transformation of the identity matrix into a strongly regular one.


@Entry(Code "STAN-CS-81-863",
	Title "A Programming and Problem-Solving Seminar",
	Author "Donald E. Knuth and Allan A. Miller",
	Price "$4.50", FREE,
	Note "84 pages",
	Date "June 1981")

This report contains edited transcripts of the discussions held in
Stanford's course CS 204, Problem Seminar, during autumn quarter 1980.
Since the topics span a large range of ideas in computer science, and
since most of the important research paradigms and programming paradigms
came up during the discussions, the notes may be of use to graduate
students of computer science at other universities, as well as to their
professors and to professional people in the ``real world.''


@Entry(Code "STAN-CS-81-864",
	Title "Three Short Essays on Decisions, Reasons, and Logics",
	Author "Jon Doyle",
	Price "$2.60", FREE,
	Note "19 pages",
	Date "May 1981")

This report collects three short, survey-level essays introducing related
approaches to decision-making, reasoning, and logic.  The first essay, @i[Making
Difficult Decisions,] was read to the Stanford Computer Forum on February 5, 1981.
The following two essays, @i[Dependencies and Assumptions] (written with Johan
de Kleer) and @i[Non-Deductive Reasoning and Non-Monotonic Logics,] are versions of
articles to appear in the @i[Handbook of Artificial Intelligence] (A. Barr, P. Cohen
and E. Feigenbaum, eds.).


@Entry(Code "STAN-CS-81-865",
	Title "Toward A Unified Logical Basis for Programming Languages",
	Author "Chih-sung Tang",
	Price "$2.65", not available,
	Note "21 pages",
	Date "June 1981")

In recent years, more and more computer scientists have been paying attention to
temporal logic, since there are many properties of programs that can be described 
only by bringing the time parameter into consideration.  But existing temporal logic
languages, such as Lucid, in spite of their mathematical elegance, are still far 
from practical.  I believe that a practical temporal-logic language, once it came 
into being, would have a wide spectrum of applications.

XYZ/E is a temporal-logic language.  Like other logic languages, it is a logic
system as well as a programming language.  But unlike them, it can express all
conventional data structures and control structures, nondeterministic or concurrent
programs, and even programs with branching-time order.  We find that the difficulties
met in other logic languages often stem from the fact that they try to deal with
these structures at a higher level.  XYZ/E adopts another approach.  We divide the
language into two forms: the internal form and the external form.  The former is
lower level, while the latter is higher.  Just as any logic system contains rules
of abbreviation, so also in XYZ/E there are rules of abbreviation to transform the
internal form into the external form, and vice versa.  These two forms can be
considered to be different representations of the same thing.  We find that this
approach can ameliorate many problems of formalization.


@Entry(Code "STAN-CS-81-866",
	Title "Verifying the Absence of Common Runtime Errors in Computer Programs",
	Author "Steven M. German",
	Price "$7.40", FREE,
	Note "179 pages",
	Date "June 1981")

The Runcheck verifier is a working prototype system for proving the absence of
runtime errors such as arithmetic overflow, array subscripting out of range,
accessing an uninitialized variable, and dereferencing a null pointer.  Such errors
cannot be detected at compile time by most compilers.  Runcheck accepts Pascal
programs documented with assertions and proves that the assertions are consistent
with the program and that no runtime errors can occur.

The thesis begins by presenting an axiomatic definition of Pascal for proving the
absence of runtime errors.  Our definition is similar to Hoare's axiom system, but
it takes into account certain restrictions which have not been considered in 
previous axiomatic definitions.  The definition is based on a special predicate,
DEF(x), which is true if  x  has a properly initialized value.  We discuss the
problem of introducing uninitialized variables in an axiomatic definition, and
construct models of the data types from nonstandard models of the integers to
justify our new approach to uninitialized variables.
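
Runcheck itself proves such properties of Pascal programs documented with
assertions; as a loose analogue of the DEF predicate, the toy checker below flags
reads of variables that are not definitely initialized.  The straight-line
program representation is invented for this note.

@begin(verbatim)
# Toy analogue of the DEF(x) predicate: report reads of variables that are
# not definitely initialized.  A program (invented representation) is a
# list of steps ('assign', target, [variables read]).

def check_def(program, params=()):
    defined = set(params)            # parameters arrive initialized
    errors = []
    for i, (op, target, reads) in enumerate(program):
        assert op == 'assign'
        for v in reads:
            if v not in defined:     # DEF(v) cannot be established here
                errors.append('step %d: %s may be uninitialized' % (i, v))
        defined.add(target)
    return errors

prog = [
    ('assign', 'x', []),             # x := constant
    ('assign', 'y', ['x', 'z']),     # y := x + z, but z was never assigned
]
print(check_def(prog))               # ['step 1: z may be uninitialized']
@end(verbatim)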

The final section draws on experience with Runcheck and the Stanford Pascal Verifier
to discuss some of the major issues concerning verification and software reliability,
including how verification can contribute to reliability even if absolute 
correctness cannot be obtained, and which applications of program verification may 
be feasible for large programs.


@Entry(Code "STAN-CS-81-867",
	Title "ADAM - An Ada based Language for Multi-processing",
	Author "Luckham, Larsen, Stevenson and von Henke",
	Price "$4.15", FREE,
	Note "71 pages",
	Date "July 1981")

Adam is an experimental language derived from Ada.  It was developed to
facilitate study of issues in Ada implementation. The two primary objectives
which motivated the development of Adam were:  to program supervisory packages
for multitask scheduling, and to formulate algorithms for compilation of Ada
tasking.

Adam is a subset of the sequential program constructs of Ada combined with a
set of parallel processing constructs which are lower level than Ada tasking.
In addition, Adam places strong restrictions on sharing of global objects
between processes.  Import declarations and propagate declarations are
included.

A compiler has been implemented in Maclisp on a DEC PDP-10.  It produces
assembly code for a PDP-10.  It supports separate compilation, generics,
exceptions, and parallel processes.

Algorithms translating Ada tasking into Adam parallel processing have been
developed and implemented.  An experimental compiler for most of the final Ada
language design, including task types and task rendezvous constructs, based on
the Adam compiler, is presently available on PDP-10's.  This compiler uses a
procedure call implementation of task rendezvous, but will be used to develop
and study alternate implementations.


@Entry(Code "STAN-CS-81-868",
	Title "The Last Whole Errata Catalog",
	Author "Donald E. Knuth",
	Price "$3.25", FREE,
	Note "42 pages",
	Date "July 1981")

This list of errata and addenda to The Art of Computer Programming supplements
previous listings published in Stanford reports CS551 (1976) and CS712 (1979).
It includes the first corrections and changes to the second edition of volume two
(published January, 1981) as well as to the most recent printings of volumes one and
three (first published in 1975).  In addition to the errors listed here, about half
of the occurrences of `which' in volumes one and three should be changed to `that'.


ERRATA:

STAN-CS-81-836, Zohar Manna and Amir Pnueli, @i[Verification of Concurrent Programs,
Part I: The Temporal Framework]